Facebook under pressure to improve livestream moderation after New Zealand mosque attacks

Facebook removed 1.5 million videos of the New Zealand mosque attack, livestreamed by gunman Brenton Tarrant, above, in the first 24 hours after the shootings. (Handout/AFP)

CHRISTCHURCH, New Zealand: Facebook is facing pressure from New Zealand's advertising industry and the country's Privacy Commissioner for its role in distributing footage of the Christchurch mosque attacks.

The attack on two Christchurch mosques, which killed 50 people, was live-streamed on Facebook for almost 17 minutes.

"Our concern as an industry is that live-streaming of these events becomes the new normal," said Paul Head, chief executive of New Zealand's Commercial Communications Council, which represents the country's advertising agencies.

"We're asking all of the platforms ... to take immediate steps to either put in place systems, processes, algorithms or artificial intelligence that stops this kind of event," he said.

Lindsay Mouat, chief executive of the Association of New Zealand Advertisers, confirmed some of New Zealand's "very largest companies" were making changes to their advertising plans in light of the mosque shootings.

Both Head and Mouat agreed social media companies must consider temporarily removing live-streaming capabilities if they were unable to moderate the content.

"This simply cannot be allowed to happen again," Head said.

New Zealand's Privacy Commissioner John Edwards said it was appalling that Facebook allowed the gunman to live-stream the attack for 17 minutes.

"There's no guarantee the same thing won't happen tomorrow," Edwards said.

"It is simply incomprehensible and unacceptable that Facebook cannot prevent that kind of content being streamed to the world," he said.

University of Auckland computer science lecturer Dr Paul Ralph said it was extremely difficult for Facebook to implement an automated system capable of identifying a live video showing a violent crime.

"Facebook should never have launched a live-streaming service if they didn't have a method of stopping a video ... of a terrorist act," Dr Ralph said.

He penned an open letter, published on noted.co.nz, calling on Facebook and YouTube to confront their role in terrorism.

Live-streaming was "a feature that should have never been launched", Dr Ralph said.

In a statement, Facebook said it was working around the clock to prevent footage of the shooting from appearing on the platform.

The company confirmed it had removed 1.5 million copies of the video, 1.2 million of which were blocked at upload and so were never published.